Theoretical properties of functional Multi Layer Perceptrons

Authors

  • Fabrice Rossi
  • Brieuc Conan-Guez
  • François Fleuret
Abstract

In this paper, we study a natural extension of Multi Layer Perceptrons (MLP) to functional inputs. We show that fundamental results for numerical MLPs extend to functional MLPs. We obtain universal approximation results showing that the expressive power of functional MLPs is comparable to that of numerical MLPs. We also obtain consistency results which imply that optimal parameter estimation for functional MLPs is consistent.
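
As a rough illustration of the functional extension studied here (not the paper's formal construction), the sketch below implements a single hidden unit acting on a functional input x(t): the usual weighted sum is replaced by the integral of w(t)x(t), approximated by trapezoidal quadrature on a sampling grid. The tanh activation, the grid, and the example weight function are assumptions made only for the illustration.

    import numpy as np

    def functional_hidden_unit(x_samples, w_samples, t_grid, bias):
        # One hidden unit of a functional MLP (illustrative sketch):
        # sigma(b + integral of w(t) * x(t) dt), with the integral
        # approximated by trapezoidal quadrature on the sampling grid.
        integral = np.trapz(w_samples * x_samples, t_grid)
        return np.tanh(bias + integral)  # tanh activation is an assumption

    # Hypothetical usage: a sine input function and a linear weight function
    t = np.linspace(0.0, 1.0, 200)
    x = np.sin(2.0 * np.pi * t)   # functional input x(t)
    w = 0.5 * t                   # weight function w(t), chosen arbitrarily
    print(functional_hidden_unit(x, w, t, bias=0.1))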

Similar resources

On Implementation of Nested Rectangular Decision Regions by Multi-layer Perceptrons I: Algorithm

There are three papers in this series discussing the partitioning capabilities of multi-layer perceptrons on nested rectangular decision regions. We propose a constructive algorithm, called the up-down algorithm, to realize nested rectangular decision regions. The algorithm determines the weights easily and can be generalized to solve other decision regions with the propertie...

Multi-layer Perceptrons for Functional Data Analysis: A Projection Based Approach

In this paper, we propose a new way to use Functional Multi-Layer Perceptrons (FMLP). In our previous work, we introduced a natural extension of Multi Layer Perceptrons (MLP) to functional inputs based on direct manipulation of the input functions. We propose here to represent input and weight functions by projecting them onto a truncated basis. We show that the proposed model has t...
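
A minimal sketch of the projection idea, under assumptions not taken from the paper: with an orthonormal truncated basis, the functional inner product between an input function and a weight function reduces to a dot product of their coefficient vectors, so a functional hidden unit becomes an ordinary numerical one acting on projected inputs. The Fourier basis on [0, 1], the truncation level, and the example coefficients are all hypothetical.

    import numpy as np

    def fourier_coefficients(f_samples, t_grid, n_basis):
        # Project a sampled function on the first n_basis elements of an
        # orthonormal Fourier basis of L2([0, 1]): 1, sqrt(2)sin(2*pi*t),
        # sqrt(2)cos(2*pi*t), sqrt(2)sin(4*pi*t), ...
        coeffs = []
        for k in range(n_basis):
            if k == 0:
                basis = np.ones_like(t_grid)
            elif k % 2 == 1:
                basis = np.sqrt(2.0) * np.sin(2.0 * np.pi * ((k + 1) // 2) * t_grid)
            else:
                basis = np.sqrt(2.0) * np.cos(2.0 * np.pi * (k // 2) * t_grid)
            coeffs.append(np.trapz(f_samples * basis, t_grid))
        return np.array(coeffs)

    # With an orthonormal basis, the integral of w(t)x(t) dt is approximately
    # the dot product of the truncated coefficient vectors.
    t = np.linspace(0.0, 1.0, 400)
    x_coef = fourier_coefficients(np.sin(2.0 * np.pi * t), t, n_basis=5)
    w_coef = np.array([0.2, -0.1, 0.4, 0.0, 0.3])  # hypothetical weight coefficients
    print(np.tanh(0.1 + x_coef @ w_coef))          # one hidden unit on projected inputs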

Multi-Layer Perceptrons with B-Spline Receptive Field Functions

Multi-layer perceptrons are often slow to learn nonlinear functions with complex local structure due to the global nature of their function approximations. It is shown that standard multi-layer perceptrons are actually a special case of a more general network formulation that incorporates B-splines into the node computations. This allows novel spline network architectures to be developed that c...
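As a loose sketch of the idea (not the architecture defined in that paper), the node below weights each input through its own quadratic B-spline function before summing, which gives locally supported receptive fields; a standard node is recovered when each spline degenerates to a linear function of its input. The knot vector, spline degree, coefficients, and tanh activation are assumptions.

    import numpy as np
    from scipy.interpolate import BSpline

    def spline_node(x, knots, coef_sets, bias):
        # One node whose per-input weighting is a quadratic B-spline of that
        # input (a locally supported "receptive field"), summed and squashed.
        total = bias
        for xi, coefs in zip(x, coef_sets):
            spline = BSpline(knots, coefs, k=2, extrapolate=False)
            total += float(np.nan_to_num(spline(xi)))  # 0 outside the knot span
        return np.tanh(total)

    # Hypothetical usage on a 2-dimensional input in [0, 1]^2
    knots = np.array([0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0])  # clamped knot vector
    coefs_a = np.array([0.1, -0.4, 0.6, 0.2])
    coefs_b = np.array([-0.2, 0.3, 0.0, 0.5])
    print(spline_node([0.3, 0.7], knots, [coefs_a, coefs_b], bias=0.05))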

Advanced Supervised Learning in Multi-layer Perceptrons - From Backpropagation to Adaptive Learning Algorithms

Computer Standards and Interfaces, Special Issue on Neural Networks (5), 1994. Advanced Supervised Learning in Multi-layer Perceptrons - From Backpropagation to Adaptive Learning Algorithms. Martin Riedmiller, Institut für Logik, Komplexität und Deduktionssysteme, University of Karlsruhe, W-76128 Karlsruhe, FRG, [email protected]. Abstract: Since the presentation of the backpropagation algorithm [1] a ...

A Pilot Sampling Method for Multi-layer Perceptrons

As the sample size grows, the accuracy of trained multi-layer perceptrons improves, with lower error rates. But we cannot use ever larger samples, because the computational complexity of training the multi-layer perceptrons becomes enormous and overfitting can occur. This paper suggests an effective approach for determining a proper sample size for multi-layer perceptr...
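
A generic progressive-sampling loop (not the pilot sampling method of the paper, whose details are truncated above) illustrates the trade-off being described: grow the training subsample geometrically and stop once held-out accuracy stops improving by more than a tolerance. The classifier configuration, growth schedule, and tolerance are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    def progressive_sample_size(X, y, tol=0.005, start=200, growth=2):
        # Generic progressive sampling: train on geometrically growing
        # subsamples and stop when validation accuracy gains fall below tol.
        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=0)
        best_acc, size = -np.inf, start
        while size <= len(X_tr):
            model = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
            model.fit(X_tr[:size], y_tr[:size])
            acc = model.score(X_val, y_val)
            if acc - best_acc < tol:
                return size, acc          # accuracy has plateaued
            best_acc, size = acc, size * growth
        return len(X_tr), best_acc

    # Hypothetical usage on synthetic data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    print(progressive_sample_size(X, y))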

Publication date: 2002